
Hearing and seeing meaning in noise: Alpha, beta and gamma oscillations predict gestural enhancement of degraded speech comprehension



Abstract

During face-to-face communication, listeners integrate speech with gestures. The semantic information conveyed by iconic gestures (e.g., a drinking gesture) can aid speech comprehension in adverse listening conditions. In this magnetoencephalography (MEG) study, we investigated the spatiotemporal neural oscillatory activity associated with gestural enhancement of degraded speech comprehension. Participants watched videos of an actress uttering clear or degraded speech, accompanied by a gesture or not, and completed a cued-recall task after each video. When gestures semantically disambiguated degraded speech, alpha and beta power suppression and a gamma power increase revealed engagement and active processing in the hand area of the motor cortex, the extended language network (LIFG/pSTS/STG/MTG), the medial temporal lobe, and occipital regions. These low- and high-frequency oscillatory modulations support general unification, integration, and lexical-access processes during online language comprehension, as well as simulation of, and increased visual attention to, manual gestures over time. Each of the individual oscillatory power modulations associated with gestural enhancement of degraded speech comprehension predicted a listener's correct disambiguation of the degraded verb after watching the videos. Our results thus go beyond the previously proposed role of oscillatory dynamics in unimodal degraded speech comprehension and provide the first evidence that low- and high-frequency oscillations predict the integration of auditory and visual information at a semantic level.
